Written by Paula Malins

 

Finding the sweet spot is every golfer’s dream. Creating a sweet-spot-o-meter would be a joy for equipment manufacturers. And scientifically defining it is the focus of sport engineer Paul Lückemann’s current research.

 

What is the sweet spot?

The sweet spot is the tiny area on the face of a golf club that gives the greatest transfer of energy between the club and the ball, ensuring the best shot in terms of distance and accuracy. 

The complex dynamics of the club-ball energy transfer – club design, player technique, even weather conditions – make calculating the sweet spot’s location hugely demanding.

 

Paul’s research

Paul is developing an intelligent system for analysing golf swings that will offer new insights into player technique and support innovation in golf club engineering – including pinpointing that elusive sweet spot.

His approach measured two key dynamic indicators (points of contact between the clubhead and ball) during impact by tracking the clubhead’s rotation.

Using a high-speed multi-camera system, he tracked a driver clubhead at 20,000 frames per second. To ensure consistency and accuracy, the swings were performed by a golf robot at 45 metres per second – eight swings for each dynamic indicator location – so that the mean and standard deviation of the clubhead’s angular velocity (the rate at which it rotates during the swing and impact) could be calculated.
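As a rough illustration of that last calculation (a hypothetical sketch, not Paul’s actual pipeline), the Python snippet below estimates angular velocity from a per-frame orientation trace sampled at 20,000 frames per second and then summarises eight swings with a mean and standard deviation; the orientation data and helper names are invented.

```python
import numpy as np

FRAME_RATE_HZ = 20_000  # high-speed capture rate reported in the study

def angular_velocity(orientation_deg: np.ndarray) -> np.ndarray:
    """Finite-difference estimate of angular velocity (deg/s) from a
    per-frame clubhead orientation trace (hypothetical input)."""
    return np.diff(orientation_deg) * FRAME_RATE_HZ

# Hypothetical example: eight robot swings tracked around impact,
# each giving an orientation trace (in degrees) for the clubhead.
rng = np.random.default_rng(0)
swings = [np.cumsum(rng.normal(0.1, 0.01, size=200)) for _ in range(8)]

peak_omegas = [angular_velocity(trace).max() for trace in swings]
print(f"mean peak angular velocity: {np.mean(peak_omegas):.1f} deg/s")
print(f"standard deviation:         {np.std(peak_omegas, ddof=1):.1f} deg/s")
```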

The resulting data were used to model and calculate the sweet spot in this scenario – and to establish an algorithm for use with other types of club and swing.

 

Ongoing research

Paul is now working to deliver software that interfaces with an existing motion capture system to analyse golf swings and calculate relevant swing metrics.

He also plans to study the effect of clubhead speed and mass distribution on the sweet spot indicators, which will make it possible to pinpoint the sweet spot for any given golf swing.

Sweet news for golfers and sport engineers everywhere.

 

Find out more

More about Paul and his research

https://www.lboro.ac.uk/study/postgraduate/student-stories/paul/

Read Paul’s research paper

https://docs.lib.purdue.edu/cgi/viewcontent.cgi?article=1021&context=resec-isea

Welcome to the 2021 Annual Report from the EPSRC Centre for Doctoral Training in Embedded Intelligence. You can read here what our students and researchers have been working on this year – a year of great challenges due to the global pandemic. Despite that, there are great stories inside.


This Annual Report gives an overview of all the activities for the EPSRC Centre for Doctoral Training in Embedded Intelligence for 2021. It includes information about our:

  • Cohorts of students: graduated researchers, those completing their degrees now, and the active cohorts.
  • Training and research dissemination events attended by our students.
  • Stories from our graduating students: their reflections on their PhD journeys and how they experienced them.
  • Alumni: where they are now.
  • Selected publications.
  • Advocacy by the Centre for manufacturing and digital technologies.

If you have any questions about what you read today, or would like to know more, do not hesitate to get in touch. 

Shaun Smith of cohort 4 recently spoke at Dynamics Days Europe 2018. Dynamics Days Europe is a series of major international conferences that provides a forum for developments in the interdisciplinary research of nonlinear science. The conference is hosted in different locations in Europe each year, and this year Loughborough University welcomed delegates in fields including physics, engineering, biology and mathematics to discuss their research.

Shaun contributed a talk on how numerical continuation can be applied to complement engine calibration. He provides a brief overview of the talk and his experience at the conference:

"My talk was centred on how tools from nonlinear dynamics could be applied to complement the process of “engine mapping". Engine mapping is the process used by manufacturers to understand the behaviour of a system by relating inputs (e.g. throttle & torque) to outputs (e.g. speed & air pressure) by running desktop simulations at almost every possible combination of inputs. For larger systems with multiple inputs and outputs, this process quickly becomes very computationally expensive, so my supervisors and I have been working on an alternative approach to this process which combines tools from nonlinear science and numerical continuation. As well as offering an efficient overview of the system, we show additional benefits these tools can offer by providing insight about the dynamic behaviour of the system that would be very difficult to obtain through engine mapping alone.”

“Having joined the CDT almost exactly a year ago, Dynamics Days Europe 2018 provided an excellent opportunity to discuss the research undertaken in my first year. Presenting our approach to researchers in the field of nonlinear science, viewing talks on a variety of different topics and discussing the innovation of nonlinear dynamics with delegates were valuable experiences for me, and I look forward to attending future conferences on nonlinear science."
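To illustrate why the exhaustive engine-mapping approach Shaun describes becomes expensive, here is a toy sketch (not Shaun’s method or model): it sweeps a grid of input combinations through a stand-in simulation function and counts the runs required as the number of inputs grows; all names, ranges and the model itself are invented.

```python
import itertools
import numpy as np

# Hypothetical input ranges for a desktop engine model: the inputs,
# ranges and model below are placeholders, not Shaun's actual setup.
throttle = np.linspace(0, 100, 50)       # %
torque_demand = np.linspace(0, 300, 50)  # Nm

def engine_model(thr, tq):
    """Stand-in for one desktop simulation run (returns fake speed, pressure)."""
    return 10.0 * thr + 0.5 * tq, 1.0 + 0.002 * thr

# Classic "engine mapping": evaluate the model at every input combination.
grid = list(itertools.product(throttle, torque_demand))
results = [engine_model(thr, tq) for thr, tq in grid]
print(f"runs needed for 2 inputs x 50 levels: {len(grid)}")    # 2500
print(f"runs needed for 5 inputs x 50 levels: {50**5:,}")      # 312,500,000
```

Numerical continuation avoids this combinatorial blow-up by tracing how solutions change as the inputs vary, rather than re-simulating every grid point.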

Dynamics Days Europe 2019 will be held in Rostock, Germany, on 2-6 September 2019.

For more information on Dynamics Days, please visit http://www.dynamicsdays.org

For details of the 2018 conference, see http://dynamicsday2018.lboro.ac.uk/index.html

Cohort 4 researcher Temi Jegede attended the 'High Performance Powertrains' seminar hosted by the Institution of Mechanical Engineers, which took place on 22nd May in Birmingham.

We asked Temi to give a summary of the proceedings.

The chairperson, Professor Jamie Turner from the University of Bath, set the scene by summarising the aim of the seminar: to highlight promising technologies that better optimise the performance of automotive powertrains. This was followed by the introduction of speakers from companies including Ford, Mahle, Cosworth and AVL, amongst many others.

Paul Freeland, a principal engineer at Cosworth, spoke on the techniques employed by Cosworth to maintain the highest possible specific power output while minimising fuel consumption. Higher specific output in this case relates to increased engine speed and increased cylinder air charge, which is directly proportional to the amount of torque the engine can produce. With regard to fuel efficiency, Cosworth have implemented techniques to maximise compression ratio while minimising frictional, pumping and heat losses. These include dual cam phasing and valve systems, which help to reduce pumping losses, and variable-displacement oil pumps, roller-element valve actuation and plasma-sprayed cylinder bores, which help to reduce frictional losses. Most of the benefit, however, came from cylinder deactivation: the temporary deactivation of one or more engine cylinders in light-load operating regions. Together, these measures lead to thermal efficiencies greater than 30% in more than 80% of the engine's operating space.
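For context on "specific power output" (a textbook relationship, not figures quoted at the seminar): brake power is torque multiplied by angular speed, so specific output rises with engine speed and with the torque that a larger cylinder air charge can support. The sketch below works through the arithmetic with purely illustrative numbers.

```python
import math

def specific_power_kw_per_litre(torque_nm: float, speed_rpm: float,
                                displacement_l: float) -> float:
    """Brake power P = 2*pi*N*T (N in rev/s), expressed per litre of displacement.
    Inputs are illustrative, not values from the seminar."""
    power_w = 2 * math.pi * (speed_rpm / 60.0) * torque_nm
    return (power_w / 1000.0) / displacement_l

# Illustrative 2.0 L engine: the same torque at a higher speed gives a higher specific output.
print(specific_power_kw_per_litre(torque_nm=250, speed_rpm=4000, displacement_l=2.0))
print(specific_power_kw_per_litre(torque_nm=250, speed_rpm=6000, displacement_l=2.0))
```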

Speakers from both Ford and JLR described approaches that were similar to those taken by Cosworth with minor differences.

It was also interesting to see the use of software simulation to reduce engine development time. Massimo Gallbati, a project manager at Enginsoft, was invited to discuss the use of virtual prototyping in engine development. He described Enginsoft's computational fluid dynamics software, which can build a virtual prototype of the combustion process and in turn provide detailed predictions of emissions, the cooling system and many other engine subsystems. This is possible because the software allows very detailed modelling of liquid behaviour. Gaseous fluids are not supported, as it is difficult to model the behaviour of an unknown mixture of gases; further research will be needed to advance this concept.

Several talks were also given on emissions and how tightening regulations are currently affecting trends in powertrain development. Hartwig Busch from the Coventry University Centre for Advanced Low-Carbon Propulsion Systems (FEV) was introduced to discuss some of the challenges emissions regulations pose. More focus is being given to CO emissions, as these are monitored under the EU6d regulations; the trend suggests that CO limits will become more stringent and replace PN emissions as the major emissions challenge. The aftertreatment available on production vehicles can curtail emissions to the desired levels, but it is only effective over full combustion cycles at stoichiometry, and significant degradation in performance has been observed when the air-fuel ratio (lambda) is away from stoichiometry. A major factor in this problem is driver behaviour: more aggressive drivers tend to make quick changes to engine speed and torque, which increases emissions. FEV is using virtualisation of calibration to tackle emissions regulations, involving concepts such as hardware-in-the-loop simulation, road virtualisation and driver behaviour modelling. Other strategies with the goal of keeping lambda at 1, such as water injection and variable compression ratio, are also employed.
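For readers unfamiliar with the lambda term above, a quick note (standard definition, not from the talk): lambda is the actual air-fuel ratio divided by the stoichiometric ratio, so lambda = 1 corresponds to stoichiometric combustion. The snippet below uses illustrative values only.

```python
STOICH_AFR_GASOLINE = 14.7  # approximate stoichiometric air-fuel ratio for petrol

def lambda_value(actual_afr: float, stoich_afr: float = STOICH_AFR_GASOLINE) -> float:
    """Normalised air-fuel ratio: 1.0 = stoichiometric, <1 rich, >1 lean."""
    return actual_afr / stoich_afr

print(lambda_value(14.7))  # 1.0  - aftertreatment works well near stoichiometry
print(lambda_value(12.5))  # ~0.85 - rich mixture; aftertreatment effectiveness degrades
```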

Overall, this seminar offered informative insight into trends in powertrain development and highlighted the commitment of many auto manufacturers to the improvement of the internal combustion engine and powertrain as we are still decades away from full electrification.

Finally, the chair and speakers held a Q&A session before giving their closing remarks.

Our cohort 2 researcher Athanasios Pouchias attended the Flow Processes in Composite Materials (FPCM) conference in Luleå, Sweden, from 30th May to 1st June 2018.

The conference was arranged by Luleå University of Technology and Swerea SICOMP AB, and took place in the small and beautiful northern city of Luleå, Sweden. FPCM 14 is part of a series of conferences covering the science and engineering of composites manufacturing. It provides a forum for scientists, engineers and designers from both academia and industry to exchange ideas, propose new solutions and promote international collaboration. The conference covered topics ranging from the challenges of graphene as a reinforcement to large-scale processing of composites with complex structures.

 

FPCM 14

 We asked Athanasios to write about his experience at the conference:

 

“During the conference, I had the opportunity to present the work I had carried out during the first two years of my PhD studies. My research focuses on monitoring the Resin Transfer Moulding (RTM) process, which is one of the most promising available technologies for manufacturing large, complex, three-dimensional parts from composite materials. The RTM process is mostly used in aeronautical, automotive and wind energy applications, such as the manufacturing of wind turbine blades. At this conference, I presented the design methodology for the development of a flow sensor that will be placed to monitor the RTM process.

The main interest of this FPCM conference was the determination of porosity and the characterisation of the permeability of fabrics. A special workshop on permeability measurements was held during the second day, where there was an open discussion on the current technologies and measuring methods for calculating the permeability of composite materials.

The second day finished with an excursion to the Arctic Circle. The Arctic Circle is the furthest point from the North Pole at which the sun does not set at the summer solstice and does not rise at the winter solstice.”

 

Arctic Circle

 

Our Cohort 2 student, Athanasios Pouchias, attended the NSIRC Conference 2018 in Cambridge, which took place on 3rd-4th July 2018.

The National Structural Integrity Research Centre (NSIRC) is a state-of-the-art postgraduate engineering facility established and managed by structural integrity specialist TWI. NSIRC unites academia and industry, working closely with lead academic partner Brunel University London and more than 20 other respected universities, as well as founder sponsors BP and the Lloyd's Register Foundation. The collaborating partners provide academic excellence to address the need for fundamental research, as well as high-quality, industry-relevant training for the next generation of structural integrity engineers.

“It was a great opportunity to present my work at the NSIRC Conference 2018 last week. Over 150 delegates were in attendance, with companies such as Rolls-Royce, Boeing and EDF Energy all represented. Also attending were NSIRC’s academic partners, which included some of the UK’s top universities, along with researchers, TWI staff and fellow students. I had the good fortune to be awarded best oral presentation for the second year of my PhD studies.”

NSIRC

 

 

Robin Hamer from cohort 3 attended the Safety-II workshop following on from his previous FRAMily training.

The Safety-II workshop was an additional two days after the initial three days of the FRAMily meeting. It was the first of its kind and was a great success.

There was a wide range of people present from different industries, including practitioners and academics. However, there were only two people from the nuclear industry, a theme that was apparent at the FRAMily meeting as well. I enjoyed listening to the various Safety-II related research lectures, and it was good to see Safety-II principles being applied to many different industries. However, I still feel there is a lack of practical application, and no one presented a Safety-II based tool – there was only classic qualitative research, such as interviews themed around the principles of resilience engineering.

In addition, there was still some discussion as to what Safety-II actually is, which I found surprising as the concept makes sense to me. It seemed that most people were coming unstuck with the technicalities of Safety-II rather than the bigger picture, i.e. that Safety-I and Safety-II are complementary. Networking with other Safety-II enthusiasts was fun, and my PhD seemed to pique the interest of a lot of the practitioners and academics, who were keen to find out how my PhD progresses and what I manage to produce.

Throughout the week-long event, I managed to informally ask my potential interview questions during the networking sessions to as many relevant people as possible. These included academics and practitioners from the steel, maritime, aviation, nuclear and healthcare industries, the Bank of England and the Royal Mint. I had some useful responses; however, I feel they are not representative of a random sample of human factors and ergonomics (HFE) experts, because the event was Safety-II focused, so naturally everyone there was primarily interested in Safety-II and had, so to speak, bought into the concept. I have summarised the responses to the interview questions below:

1) How is risk assessed within your respective industry?

The general consensus was that risk is still managed reactively, i.e. we wait for something to go wrong and then focus on it to identify and design out its presumed root cause. In addition, it was evident that behavioural safety is another focus, whereby good behaviour is reinforced and bad habits are punished, removed or reduced. That is very much classical and operant conditioning, and, according to conversations with people present, this seems to be a ‘fad’.

2) What makes managing risk easy/difficult?

It was agreed that managing risk is near impossible and no longer easy to do. This is because socio-technical systems have become so large and complicated that we no longer know what is happening at any one time. Situations that may eventually cause harm are latent: generally, tiny random factors and precursors add up and eventually cause an accident or incident, and at the moment we are unaware of what these are and unable to identify them. Another theme that became apparent was Work-as-Imagined (WAI) versus Work-as-Done (WAD). Feedback from conversations pointed out that the problem is that what management think workers do and what workers actually do are two completely different things. Therefore, knowing the general day-to-day work and processes that take place is impossible, since there is a massive discrepancy between WAI and WAD. This is further complicated by the fact that there are numerous other terms for what really happens and what people say happens, e.g. Work-as-Prescribed and Work-as-Disclosed, which makes understanding what happens, let alone capturing it, even harder.

3) What are the current main issues in your respective industry?

Again, there was a theme to the answers here. It was generally agreed that Safety-II is the saviour – which isn’t surprising, as it was a Safety-II themed week. However, it was also agreed that there is a problem with management. Comments suggested that the workforce who participate in operations (on the ground) really buy into the idea of Safety-II and can see the benefit, but it was clear that management believe they are currently doing enough and that there simply isn’t the time or the resources to incorporate Safety-II within their industries. I probed further to ask why this may be the case. Again, the theme was that it is not an organisational culture issue: it seems that management, and those who have the power to implement these new principles and ideas, are now older and inflexible in their approach to safety. For me, this isn’t surprising, as there is still no evidence that I am aware of which shows the benefits of Safety-II in a real-life case study. It would therefore seem that the issue is generational rather than cultural. Despite a few early adopters of the theory, the majority of management are still sceptical about Safety-II and are therefore unwilling to give practitioners and academics free rein to implement and develop methods, approaches and tools.

4) What challenges do they propose to the organisation?

It seems that the issue with the Safety-II principle is that it is marketed in a way that allows people to assume that Safety-II replaces Safety-I. In addition, to understand Safety-II you almost need to rewire your thinking about how accidents and incidents occur, which isn’t easy, especially if you have spent years understanding risk and safety in the classic Safety-I way. This presents a challenge: to introduce and communicate Safety-II to management and members of these organisations at a higher level in a way that makes them understand that we need this form of safety, while also ensuring that it is seen as complementary to Safety-I. I believe this can be done by ditching the term Safety-II and using other, more attractive wording. People agreed with this, and some examples suggested were: ‘Golden Days’, ‘The Art of Work’, ‘Mastering HFE’.

5) What are the main fresh ideas emerging within your respective industries?

Since all suggestions were based around Safety-II and Resilience Engineering, I will name and discuss them in the context of Safety-II. It was clear that Safety-II needs to be quantified in some way. Since FRAM (the Functional Resonance Analysis Method) was the main theme of the week, it was interesting to see how it could be quantified using a Bayesian Network to give a likelihood of something going right or wrong in a system, based on how important a system component is and how dependent other components are on it. This was interesting because, in many different scenarios, a probability can be obtained and therefore a ‘what if?’ question can be quantified. There is therefore a need for a tool/instrument built on Safety-II and Resilience Engineering principles and, moreover, for interventions that can be used to reinforce desirable, or reduce undesirable, actions and behaviours in everyday work.
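As a rough illustration of the kind of quantification discussed (a toy example, not one of the models presented at the event), the sketch below treats two upstream FRAM functions as parents of a downstream function in a minimal Bayesian network and computes the probability that the downstream function goes right; all function names and probabilities are invented.

```python
from itertools import product

# Hypothetical FRAM functions: two upstream functions feed one downstream function.
# Prior probabilities that each upstream function performs as intended (invented numbers).
p_upstream = {"prepare_plan": 0.95, "brief_team": 0.90}

# Conditional probability that the downstream function "execute_task" goes right,
# given which upstream functions went right (invented numbers expressing dependence).
p_execute_given = {
    (True, True): 0.98,
    (True, False): 0.70,
    (False, True): 0.60,
    (False, False): 0.20,
}

# Marginalise over upstream states: P(execute) = sum over states of
# P(state) * P(execute | state), assuming the upstream functions are independent.
p_execute = 0.0
for a_ok, b_ok in product([True, False], repeat=2):
    p_a = p_upstream["prepare_plan"] if a_ok else 1 - p_upstream["prepare_plan"]
    p_b = p_upstream["brief_team"] if b_ok else 1 - p_upstream["brief_team"]
    p_execute += p_a * p_b * p_execute_given[(a_ok, b_ok)]

print(f"P(execute_task goes right) = {p_execute:.3f}")
# A 'what if?' question can then be asked by lowering one of the upstream
# probabilities and observing how much the downstream likelihood changes.
```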

6) Is there a need for a new method/tool/instrument?

Strangely enough, people did not think there was a need for a new tool or instrument. I disagree, though, as I definitely think there is an opportunity for a new tool/instrument from more of a practitioner standpoint. People were quite fond of the FRAM method and thought it could be adapted for use in industry. However, I think the FRAM method is more an exercise for academics and is not fit for use in industry as a main tool. This is because FRAM is essentially a tool that identifies weaknesses and vulnerabilities in a system, which provides an opportunity for discussion; moreover, FRAM, the interviews it requires and the analysis all take a lot of time. Despite this, people thought that FRAM was excellent, and discussion was quite often steered towards how great FRAM is and all the different ways it can be used. I believe, though, that there is a need for a new method, tool or instrument that can be used alongside FRAM.

Robin Hamer from cohort 3 participated in the 2018 FRAMily tutorial and workshop in Cardiff.

This tutorial and workshop took place at the Cardiff School of Engineering and is an annual gathering of Resilience and Safety-II academics, practitioners and industry. This particular tutorial and workshop concerned a method for analysing complex socio-technical systems called FRAM (the Functional Resonance Analysis Method). In the past I have tried to read and apply the information in the books concerning FRAM; however, I found that, without instruction and due to the complexity of the method, it was very difficult to implement. The first day therefore proved to be very useful, as we went back to basics and learnt about the logic behind the method and how it is used.

In addition, we were introduced to an open-source computer program that allows me to create and edit FRAM models easily. Over the course of the day, I gained a basic understanding of how the models work and how to create my own.

On the second and third days, presentations were given by various academics and practitioners, mainly covering how FRAM is applied in practice, its use in case studies, or mixing FRAM with other methods to quantify its outputs. I found these two days slightly overwhelming in terms of knowledge: I thought I had just got a grasp of FRAM, but some of the work being done with the method really was complicated, and it was clear that FRAM has many applications.

One presenter had even used FRAM to model the human brain, which I thought was amazing conceptually. All in all, the talks were very good and the questions stimulating; however, I still have my own reservations about the method. There was little concrete evidence of how you get from the model to spotting weaknesses in the system and then to providing interventions. Although this may be obvious to experienced FRAM users, it was not made obvious over the course of the three days. I can see some use for FRAM within my PhD, in the sense that it would be good to model the work-as-imagined and work-as-done of the tasks I wish to improve using my tool. I feel that the tool I will adapt and FRAM, used together, may provide a comprehensive understanding of the tasks and problems within the nuclear industry.

To summarise, I thought the three days were very useful and have given me new ideas for my PhD. However, I still have some reservations about FRAM: it is too complicated and time-consuming. Whilst I see value in it as an academic exercise, I see little use in relying on FRAM alone for this PhD, since the project is industry focused. However, I do see a use for FRAM alongside another tool/instrument. I also managed to network and to briefly and informally put some of my potential interview questions to numerous people. The results weren’t surprising, but it was good to see people across various industries all saying the same thing.

follow us
@cdt_ei


General Enquiries Contacts:

Loughborough University

Loughborough, Leicestershire
LE11 3TU
United Kingdom
Tel: +44 (0)1509 227 518
CDT Office
cdt-ei@lboro.ac.uk